Modified Divergences for Gaussian Densities

Authors

  • Karim T. Abou-Moustafa
  • Frank P. Ferrie
Abstract

Multivariate Gaussian densities are pervasive in pattern recognition and machine learning. A central operation that appears in most of these areas is to measure the difference between two multivariate Gaussians. Unfortunately, traditional measures based on the Kullback–Leibler (KL) divergence and the Bhattacharyya distance do not satisfy all metric axioms necessary for many algorithms. In this paper we propose a modification for the KL divergence and the Bhattacharyya distance, for multivariate Gaussian densities, that transforms the two measures into distance metrics. Next, we show how these metric axioms impact the unfolding process of manifold learning algorithms. Finally, we illustrate the efficacy of the proposed metrics on two different manifold learning algorithms when used for motion clustering in video data. Our results show that, in this particular application, the new proposed metrics lead to significant boosts in performance (at least 7%) when compared to other divergence measures.
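
For reference, both baseline measures named in the abstract have closed forms for multivariate Gaussians. The sketch below is a minimal NumPy illustration of those standard forms, plus the Hellinger distance, a well-known metric derived from the Bhattacharyya coefficient; it is not the authors' proposed modified metrics, which are defined in the full paper, and the function names are ours.

```python
import numpy as np

def kl_gaussian(mu_p, cov_p, mu_q, cov_q):
    """Closed-form KL(p || q) between two multivariate Gaussians."""
    d = mu_p.shape[0]
    cov_q_inv = np.linalg.inv(cov_q)
    diff = mu_q - mu_p
    _, logdet_p = np.linalg.slogdet(cov_p)
    _, logdet_q = np.linalg.slogdet(cov_q)
    return 0.5 * (np.trace(cov_q_inv @ cov_p) + diff @ cov_q_inv @ diff
                  - d + logdet_q - logdet_p)

def bhattacharyya_gaussian(mu_1, cov_1, mu_2, cov_2):
    """Closed-form Bhattacharyya distance between two multivariate Gaussians."""
    cov = 0.5 * (cov_1 + cov_2)
    diff = mu_1 - mu_2
    _, logdet = np.linalg.slogdet(cov)
    _, logdet_1 = np.linalg.slogdet(cov_1)
    _, logdet_2 = np.linalg.slogdet(cov_2)
    return (0.125 * diff @ np.linalg.solve(cov, diff)
            + 0.5 * (logdet - 0.5 * (logdet_1 + logdet_2)))

def hellinger_gaussian(mu_1, cov_1, mu_2, cov_2):
    """Hellinger distance: sqrt(1 - BC), where BC = exp(-Bhattacharyya distance).
    Unlike KL and Bhattacharyya, this one satisfies all metric axioms."""
    bc = np.exp(-bhattacharyya_gaussian(mu_1, cov_1, mu_2, cov_2))
    return np.sqrt(max(0.0, 1.0 - bc))
```

Note that simply symmetrizing KL (e.g. the Jeffreys divergence KL(p || q) + KL(q || p)) fixes symmetry but still violates the triangle inequality, which is the kind of gap the paper's modified measures are designed to close.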

Similar articles

Rapidity divergences and valid definitions of parton densities

Rapidity divergences occur when parton densities in a gauge theory are defined in the most natural way, as expectation values of partonic number operators in light-front quantization. I review these and other related divergences, and show how the definitions of parton densities can be modified to remove the divergences. A modified definition is not only essential for many phenomenological appli...

Log-Determinant Divergences Revisited: Alpha-Beta and Gamma Log-Det Divergences

In this paper, we review and extend a family of log-det divergences for symmetric positive definite (SPD) matrices and discuss their fundamental properties. We show how to generate from the parameterized Alpha-Beta (AB) and Gamma log-det divergences many well-known divergences, for example, Stein's loss, the S-divergence, also called the Jensen-Bregman LogDet (JBLD) divergence, the Logdet Zero (Bhattac...
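
For concreteness, two of the divergences named above have simple closed forms on SPD matrices. The sketch below is a minimal NumPy illustration of the standard definitions of Stein's loss and the JBLD/S-divergence, not of the parameterized Alpha-Beta and Gamma families introduced in that paper; function names are ours.

```python
import numpy as np

def stein_loss(X, Y):
    """Stein's loss between SPD matrices: tr(X Y^-1) - log det(X Y^-1) - n."""
    n = X.shape[0]
    XYinv = X @ np.linalg.inv(Y)
    _, logdet = np.linalg.slogdet(XYinv)
    return np.trace(XYinv) - logdet - n

def jbld(X, Y):
    """Jensen-Bregman LogDet (S-) divergence:
    log det((X + Y) / 2) - 0.5 * log det(X Y)."""
    _, logdet_mid = np.linalg.slogdet(0.5 * (X + Y))
    _, logdet_x = np.linalg.slogdet(X)
    _, logdet_y = np.linalg.slogdet(Y)
    return logdet_mid - 0.5 * (logdet_x + logdet_y)
```

The square root of the S-divergence is known to satisfy the metric axioms on SPD matrices, which is one reason this family is relevant to the metric-axiom discussion in the main abstract above.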

Investigation to Reliability of Optical Communication Links using Auto-Track Subsystems in Presence of Different Beam Divergences

In this paper, we investigate the effects of an auto-tracking subsystem, together with different beam divergences, on the SNR, BER and stability of FSO communication links. For this purpose we compute the power, SNR and BER at the receiver, based on the analytic formula of a Gaussian beam on the receiver plane. In this computation the atmospheric effects, including absorption, scattering and turbulence, are c...
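
As a rough sketch of the kind of link-budget quantity such an analysis starts from, the code below computes the spread of a Gaussian beam in vacuum and the fraction of its power collected by a centered circular receiver aperture. The atmospheric absorption, scattering, turbulence and auto-tracking models of that paper are not reproduced here, and the parameter values are illustrative only.

```python
import numpy as np

def beam_radius(w0, wavelength, z):
    """1/e^2 radius of a Gaussian beam after propagating a distance z in vacuum."""
    z_rayleigh = np.pi * w0**2 / wavelength
    return w0 * np.sqrt(1.0 + (z / z_rayleigh)**2)

def collected_fraction(w0, wavelength, z, aperture_radius):
    """Fraction of total beam power falling inside a centered circular aperture."""
    w = beam_radius(w0, wavelength, z)
    return 1.0 - np.exp(-2.0 * aperture_radius**2 / w**2)

# Illustrative numbers: 1550 nm beam, 2 cm waist, 1 km link, 5 cm aperture radius.
print(collected_fraction(w0=0.02, wavelength=1550e-9, z=1000.0, aperture_radius=0.05))
```

The far-field divergence half-angle of such a beam is wavelength / (pi * w0); a narrower beam delivers more power to the receiver but is more sensitive to pointing error, which is where an auto-tracking subsystem comes in.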

Multivariate parametric density estimation based on the modified Cramér-von Mises distance

In this paper, a novel distance-based density estimation method is proposed, which considers the overall density function in the goodness-of-fit. In detail, the parameters of Gaussian mixture densities are estimated from samples, based on the distance of the cumulative distributions over the entire state space. Due to the ambiguous definition of the standard multivariate cumulative distribution...
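
The modified multivariate Cramér-von Mises distance used in that paper is not reproduced here. As a simplified one-dimensional analogue of distance-based parameter estimation, the sketch below fits a single Gaussian by minimizing the standard Cramér-von Mises criterion between the empirical and parametric CDFs; the function names and the SciPy-based setup are ours.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

def cvm_criterion(params, samples):
    """Standard 1-D Cramer-von Mises statistic between the empirical CDF of the
    sorted samples and a Gaussian CDF with the given mean and log-std."""
    mu, log_sigma = params
    x = np.sort(samples)
    n = x.size
    F = norm.cdf(x, loc=mu, scale=np.exp(log_sigma))
    i = np.arange(1, n + 1)
    return 1.0 / (12.0 * n) + np.sum(((2.0 * i - 1.0) / (2.0 * n) - F) ** 2)

rng = np.random.default_rng(0)
samples = rng.normal(loc=2.0, scale=0.5, size=500)
x0 = np.array([samples.mean(), np.log(samples.std())])  # moment-based start
fit = minimize(cvm_criterion, x0=x0, args=(samples,), method="Nelder-Mead")
mu_hat, sigma_hat = fit.x[0], np.exp(fit.x[1])
print(mu_hat, sigma_hat)  # should land near the true values 2.0 and 0.5
```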

Journal title:

Volume   Issue

Pages  -

Publication date: 2012